
    Computability of Julia sets

    In this paper we settle most of the open questions on algorithmic computability of Julia sets. In particular, we present an algorithm for constructing quadratics whose Julia sets are uncomputable. We also show that the filled Julia set of a polynomial is always computable. Comment: Revised. To appear in Moscow Math. Journal
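
    To make the notion of computing a (filled) Julia set concrete, here is a minimal escape-time sketch for the quadratic map z -> z^2 + c. This is only the standard pixel-by-pixel heuristic, not the certified algorithm from the paper; the iteration bound, escape radius, and the sample value of c are illustrative assumptions.

        # Escape-time sketch (a standard heuristic, not the paper's certified
        # algorithm) for approximating the filled Julia set of z -> z^2 + c.
        # The iteration bound, escape radius, and the value of c are illustrative.

        def in_filled_julia(z0: complex, c: complex,
                            max_iter: int = 200, radius: float = 2.0) -> bool:
            """True if the orbit of z0 under z -> z^2 + c has not escaped."""
            z = z0
            for _ in range(max_iter):
                if abs(z) > radius:      # for |c| <= 2, |z| > 2 guarantees escape
                    return False
                z = z * z + c
            return True

        # Sample the square [-1.5, 1.5]^2 on a coarse grid for c = -0.8 + 0.156i.
        c = complex(-0.8, 0.156)
        grid = [[in_filled_julia(complex(x / 20, y / 20), c)
                 for x in range(-30, 31)]
                for y in range(-30, 31)]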

    Simulating Noisy Channel Interaction

    We show that $T$ rounds of interaction over the binary symmetric channel $BSC_{1/2-\epsilon}$ with feedback can be simulated with $O(\epsilon^2 T)$ rounds of interaction over a noiseless channel. We also introduce a more general "energy cost" model of interaction over a noisy channel. We show energy cost to be equivalent to external information complexity, which implies that our simulation results are unlikely to carry over to energy complexity. Our main technical innovation is a self-reduction from simulating a noisy channel to simulating a slightly-less-noisy channel, which may have other applications in the area of interactive compression.
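
    The $O(\epsilon^2 T)$ bound matches the intuition that each use of $BSC_{1/2-\epsilon}$ carries only $\Theta(\epsilon^2)$ bits: the channel capacity is $1 - H(1/2-\epsilon) \approx (2/\ln 2)\,\epsilon^2$. The short check below verifies this approximation numerically; it is a back-of-the-envelope illustration, not part of the paper's argument.

        # Back-of-the-envelope check (not from the paper): the capacity of
        # BSC_{1/2 - eps} is 1 - H(1/2 - eps) ~ (2 / ln 2) * eps^2, i.e.
        # Theta(eps^2) bits per channel use, which is the intuition behind
        # simulating T noisy rounds with O(eps^2 T) noiseless rounds.
        from math import log, log2

        def binary_entropy(p: float) -> float:
            return -p * log2(p) - (1 - p) * log2(1 - p)

        for eps in (0.1, 0.01, 0.001):
            capacity = 1 - binary_entropy(0.5 - eps)
            approx = 2 * eps ** 2 / log(2)
            print(f"eps={eps}: capacity={capacity:.6g}  (2/ln2)*eps^2={approx:.6g}")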

    The Role of Randomness and Noise in Strategic Classification

    We investigate the problem of designing optimal classifiers in the strategic classification setting, where the classification is part of a game in which players can modify their features to attain a favorable classification outcome (while incurring some cost). Previously, the problem has been considered from a learning-theoretic perspective and from the algorithmic fairness perspective. Our main contributions include:
    1. Showing that if the objective is to maximize the efficiency of the classification process (defined as the accuracy of the outcome minus the sunk cost of the qualified players manipulating their features to gain a better outcome), then using randomized classifiers (that is, ones where the probability of a given feature vector being accepted by the classifier is strictly between 0 and 1) is necessary.
    2. Showing that in many natural cases, the imposed optimal solution (in terms of efficiency) has the structure in which players never change their feature vectors: the randomized classifier is structured so that the gain in the probability of being classified as a 1 does not justify the expense of changing one's features (a toy best-response calculation in this spirit is sketched below).
    3. Observing that randomized classification is not a stable best response from the classifier's viewpoint, and that the classifier does not benefit from randomization without creating instability in the system.
    4. Showing that in some cases, a noisier signal leads to better equilibrium outcomes, improving both accuracy and fairness when multiple subpopulations with different feature-adjustment costs are involved. This is interesting from a policy perspective, since it is hard to force institutions to stick to a particular randomized classification strategy (especially in the context of a market with multiple classifiers), but it is possible to alter the information environment to make the feature signals inherently noisier.
    Comment: 22 pages. Appeared in FORC, 2020
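
    The sketch below illustrates the best-response structure from contribution 2: a player changes features only if the gain in acceptance probability, times the value of acceptance, exceeds the cost of the move. The classifier, the linear cost model, and all parameter values are assumptions chosen for illustration, not the paper's construction.

        # Toy best-response sketch; the classifier, linear cost model, and all
        # parameter values are assumptions for illustration, not the paper's
        # construction. A player at feature x moves to x' only if the gain in
        # acceptance probability, times the value of acceptance, beats the cost.

        def best_response(x, p, value, cost_per_unit, candidates):
            """Feature value x' maximizing value * p(x') minus the cost of moving up from x."""
            def utility(x_new):
                return value * p(x_new) - cost_per_unit * max(0.0, x_new - x)
            return max(candidates, key=utility)

        def ramp_classifier(x):
            # Acceptance probability ramping linearly from 0 to 1 on [1, 3], so it
            # is strictly between 0 and 1 on a wide range of feature values.
            return min(1.0, max(0.0, (x - 1.0) / 2.0))

        candidates = [i / 10 for i in range(0, 41)]      # feature values 0.0 .. 4.0
        print(best_response(x=1.5, p=ramp_classifier, value=1.0,
                            cost_per_unit=0.6, candidates=candidates))
        # prints 1.5: the slope (0.5 probability per unit) never beats the cost
        # (0.6 per unit), so the player's best response is to stay put.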

    Information complexity is computable

    The information complexity of a function $f$ is the minimum amount of information Alice and Bob need to exchange to compute the function $f$. In this paper we provide an algorithm for approximating the information complexity of an arbitrary function $f$ to within any additive error $\alpha > 0$, thus resolving an open question as to whether information complexity is computable. In the process, we give the first explicit upper bound on the rate of convergence of the information complexity of $f$ when restricted to $b$-bit protocols to the (unrestricted) information complexity of $f$. Comment: 30 pages
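
    For concreteness, the quantity being approximated is the internal information cost $I(X;\Pi\mid Y) + I(Y;\Pi\mid X)$ of a protocol with transcript $\Pi$ on inputs $(X, Y)$, minimized over protocols computing $f$. The toy computation below evaluates this cost for the naive protocol for AND in which both inputs are announced; it only illustrates the definition and is unrelated to the paper's approximation algorithm.

        # Definitional toy example (not the paper's approximation algorithm):
        # the internal information cost of a protocol Pi on inputs (X, Y) is
        # I(X; Pi | Y) + I(Y; Pi | X). Here we evaluate it for the naive
        # protocol for AND in which Alice announces x and then Bob announces y,
        # on uniformly distributed inputs.
        from collections import defaultdict
        from itertools import product
        from math import log2

        # Joint distribution over (x, y, transcript); this protocol's transcript
        # is simply the pair (x, y).
        joint = {(x, y, (x, y)): 0.25 for x, y in product((0, 1), repeat=2)}

        def cond_mutual_information(joint, a_idx, pi_idx, c_idx):
            """I(A; Pi | C) for a distribution over outcome tuples, by direct summation."""
            def marginal(indices):
                m = defaultdict(float)
                for outcome, prob in joint.items():
                    m[tuple(outcome[i] for i in indices)] += prob
                return m
            p_apc = marginal((a_idx, pi_idx, c_idx))
            p_ac = marginal((a_idx, c_idx))
            p_pc = marginal((pi_idx, c_idx))
            p_c = marginal((c_idx,))
            return sum(prob * log2(prob * p_c[(c,)] / (p_ac[(a, c)] * p_pc[(pi, c)]))
                       for (a, pi, c), prob in p_apc.items())

        ic = (cond_mutual_information(joint, 0, 2, 1)    # I(X; Pi | Y)
              + cond_mutual_information(joint, 1, 2, 0)) # I(Y; Pi | X)
        print(ic)  # 2.0 bits: the naive protocol reveals both inputs completely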